Method and apparatus for volumetric image navigation
Abstract:
A surgical navigation system has a computer with a memory and display connected to a surgical instrument or pointer and position tracking system, so that the location and orientation of the pointer are tracked in real time and conveyed to the computer. The computer memory is loaded with data from an MRI, CT, or other volumetric scan of a patient, and this data is utilized to dynamically display 3-dimensional perspective images in real time of the patient's anatomy from the viewpoint of the pointer. The images are segmented and displayed in color to highlight selected anatomical features and to allow the viewer to see beyond obscuring surfaces and structures. The displayed image tracks the movement of the instrument during surgical procedures. The instrument may include an imaging device such as an endoscope or ultrasound transducer, and the system also displays the image for this device from the same viewpoint, and enables the two images to be fused so that a combined image is displayed. The system is adapted for easy and convenient operating room use during surgical procedures.
Publication number: US20010007919A1
Application number: US09/777,777
Filing date: 2001-02-05
Publication date: 2001-07-12
Inventor: Ramin Shahidi
Applicant: Ramin Shahidi
IPC main classification: A61B5-06
Description:
[0001] 1. Field of the Invention [0001] [0002] This invention pertains generally to systems and methods for generating images of three dimensional objects for navigation purposes, and more particularly to systems and methods for generating such images in medical and surgical applications. [0002] [0003] 2. Description of the Background Art [0003] [0004] Precise imaging of portions of the anatomy is an increasingly important technique in the medical and surgical fields. In order to lessen the trauma to a patient caused by invasive surgery, techniques have been developed for performing surgical procedures within the body through small incisions with minimal invasion. These procedures generally require the surgeon to operate on portions of the anatomy that are not directly visible, or can be seen only with difficulty. Furthermore, some parts of the body contain extremely complex or small structures and it is necessary to enhance the visibility of these structures to enable the surgeon to perform more delicate procedures. In addition, planning such procedures requires the evaluation of the location and orientation of these structures within the body in order to determine the optimal surgical trajectory. [0004] [0005] New diagnostic techniques have been developed in recent years to obtain images of internal anatomical structures. These techniques offer great advantages in comparison with the traditional X-ray methods. Newer techniques include microimpulse radar (MIR), computed tomography (CT) scans, magnetic resonance imaging (MRI), positron emission tomography (PET), ultrasound (US) scans, and a variety of other techniques. Each of these methods has advantages and drawbacks in comparison with other techniques. For example, the MRI technique is useful for generating three-dimensional images, but it is only practical for certain types of tissue, while CT scans are useful for generating images of other anatomical structures. Ultrasound scanning, in contrast, is a relatively rapid procedure; however, it is limited in its accuracy and signal-to-noise ratio. [0005] [0006] The imaging problem is especially acute in the field of neurosurgery, which involves performing delicate surgical procedures inside the skull of the patient. The above techniques have improved the surgeon's ability to locate precisely various anatomical features from images of structures within the skull. However, this has only limited usefulness in the operating room setting, since it is necessary to match what the surgeon sees on the 2D image with the actual 3D patient on the operating table. The neurosurgeon is still compelled to rely to a considerable extent on his or her knowledge of human anatomy. [0006] [0007] The stereotactic technique was developed many years ago to address this problem. In stereotactic surgery, a frame of reference is attached to the patient's head which provides reference points for the diagnostic images. The device further includes guides for channeling the surgical tool along a desired trajectory to the target lesion within the brain. This method is cumbersome and has the drawback that the surgeon cannot actually see the structures through which the trajectory is passing. There is always the risk of damage to obstacles in the path of the incision, such as portions of the vascular or ventricular system. 
In essence, with previous neurosurgical techniques the surgeon is in a position much like that of a captain piloting a vessel traveling in heavy fog through waters that have many hazards, such as shoals, reefs, outcroppings of rocks, icebergs, etc. Even though the captain may have a very good map of these hazards, nevertheless there is the constant problem of keeping track of the precise location of the vessel on the map. In the same way, the neurosurgeon having an accurate image scan showing the structures within the brain must still be able to precisely locate where the actual surgical trajectory lies on the image in order to navigate successfully to the target location. In the operating room setting, it is further necessary that this correlation can be carried out without interfering with the numerous other activities that must be performed by the surgeon. [0007] [0008] The navigation problem has been addressed in U.S. Pat. No. 5,383,454, issued Jan. 24, 1995 (Bucholz). This patent describes a system for indicating the position of a surgical probe within a head on an image of the head. The system utilizes a stereotactic frame to provide reference points, and to provide means for measuring the position of the probe tip relative to these reference points. This information is converted into an image by means of a computer. [0008] [0009] U.S. Pat. No. 5,230,623, issued Jul. 27, 1993 (Guthrie), discloses an operating pointer whose position can be detected and read out on a computer and associated graphics display. The pointer can also be used as a “3D mouse” to enable the surgeon to control the operation of the computer without releasing the pointer. [0009] [0010] U.S. Pat. No. 5,617,857, issued Apr. 8, 1997 (Chader et al.) sets forth an imaging system and method for interactively tracking the position of a medical instrument by means of a position-detecting system. The pointer includes small light-emitting diodes (LED's), and a stationary array of radiation sensors is provided for detecting pulses emitted by these LED's and utilizing this information to ascertain dynamically the position of the pointer. Reference is made also to U.S. Pat. No. 5,622,170, issued Apr. 22, 1997 (Schulz), which describes a similar system connected to a computer display for displaying the position of an invasive surgical probe relative to a model image of the object being probed (such as a brain). [0010] [0011] U.S. Pat. No. 5,531,227, issued Jul. 2, 1996 (Schneider) explicitly addresses the problem recognized in many other references that it is desirable to provide a real time display of a surgical probe as it navigates through the brain. This patent describes a system for providing images along the line of sight of the surgeon in a dynamic real-time fashion. In this system the images that are displayed are resliced images from a three-dimensional data reconstruction which are sections or slices orthogonal to the line of sight, taken at various positions along this line specified by the user. Thus, while the viewpoint for the line of sight is always external to the body, the sectional planes that are used to define the virtual images may constitute various slices through the body chosen by the surgeon. These images may be superimposed on actual images obtained by an image recording device directed along the line of sight such as a video camera attached to the surgeon's head, and the composite images may be displayed. 
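The reslicing just described amounts to resampling a reconstructed volume on a plane orthogonal to a chosen line of sight. The following is a minimal sketch of such a reslice; it is an illustration rather than any patented implementation, and it assumes a NumPy volume addressed in voxel coordinates, trilinear interpolation via SciPy, and illustrative names (reslice_orthogonal, eye, depth):

    import numpy as np
    from scipy.ndimage import map_coordinates

    def reslice_orthogonal(volume, eye, direction, depth, size=256, spacing=1.0):
        """Resample a planar slice perpendicular to 'direction', 'depth' voxels beyond 'eye'."""
        d = np.asarray(direction, float)
        d /= np.linalg.norm(d)                       # unit line-of-sight vector
        # Two in-plane axes spanning the slice plane, both orthogonal to the line of sight.
        helper = np.array([0.0, 0.0, 1.0]) if abs(d[2]) < 0.9 else np.array([1.0, 0.0, 0.0])
        u = np.cross(d, helper); u /= np.linalg.norm(u)
        v = np.cross(d, u)
        center = np.asarray(eye, float) + depth * d  # point where the plane crosses the line
        r = (np.arange(size) - size / 2) * spacing
        uu, vv = np.meshgrid(r, r, indexing="ij")
        # Voxel coordinates of every slice pixel, then trilinear interpolation.
        coords = center[:, None, None] + u[:, None, None] * uu + v[:, None, None] * vv
        return map_coordinates(volume, coords, order=1, mode="nearest")

Calling such a routine at several depths along the line of sight yields a stack of user-selected sections of the kind described above.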
[0011] [0012] The systems described above attempt to address the navigation problem in various ways, and they all have the common drawback of requiring a certain level of abstract visualization by the surgeon during an operating room procedure. When the surgeon is proceeding through the brain toward a target tumor or lesion, it is desirable to be fully aware of all of the structures around the surgical trajectory. With previous systems the displays that are presented do not provide all of this information in a single convenient real-time display, and they require the viewer to piece together and re-orient the displayed information to obtain a mental picture of the surrounding structures. These are serious practical disadvantages in an operating room setting. What is absent from previous systems is a 3D display that shows, in a real-time view, the various structures looking ahead from the surgical probe along a line of sight into the brain in three and two dimensions, including structures hidden by other features. [0012] SUMMARY OF THE INVENTION [0013] The present invention provides an improved system and method for displaying 3D images of anatomical structures in real time during surgery to enable the surgeon to navigate through these structures during the performance of surgical procedures. This system is also useful in planning of surgical procedures. The system includes a computer with a display and input devices such as a keyboard and mouse. The system also includes a position tracking system that is connected both to the computer and also to the surgical probes or other instruments that are used by the surgeon. The position tracking system provides continual real time data to the computer indicating the location and orientation of the surgical instrument in use. The computer further includes a memory containing patient data produced by imaging scans, such as CT or MRI scans, from which 2-dimensional and 3-dimensional images of the anatomical structure may be generated. Means are provided for registration of these images with respect to the patient. [0013] [0014] The computer memory is further provided with programs that control the generation of these anatomical images. These programs include software for segmentation of the scan images to identify various types of structures and tissues, as well as the reconstruction of 2D and 3D images from the scan data. This software allows these images to be displayed with various magnifications and orientations, and with various sectional views produced by slice planes in various locations and orientations, all controlled by the surgeon. [0014] [0015] This image-generating software has the important feature that it produces 3D images that are perspective views of the anatomical structures, with user-controlled means for varying the viewing orientation and location, and also varying the displayed transparency or opacity of various types of tissues, structures, and surfaces in the viewed region of interest. This enables the user to effectively “see through” surfaces and structures in the line of sight of the image to reveal other structures that would otherwise be hidden in that particular view. [0015] [0016] Further, the images are generated from the viewpoint of the surgical probe or instrument that is in use, looking from the tip of the instrument along its longitudinal axis. 
Thus, when an invasive surgical instrument such as a scalpel or forceps is inserted into an incision in the body, the display provides a three dimensional perspective view of anatomical structures from a viewpoint inside the body. These images are all generated in real time “on the fly”. Thus, as the instrument is moved or rotated, the position tracking system continually provides data to the computer indicating the location and orientation of the instrument, and the displayed image is continually updated to show the structures toward which the instrument is pointing. [0016] [0017] In addition, for probes or instruments being used that are capable themselves of generating images, such as ultrasound probes, endoscopes, or surgical microscopes, the system provides means for integrating these images with those generated from the scan data. The software enables the user to overlay the “actual images” generated by these instruments with the “virtual images” generated from the scan data. [0017] [0018] It is an object of this invention to provide a system and method for generating an image in three dimensional perspective of anatomical structures encountered by a surgeon during the performance of surgical procedures. [0018] [0019] A second object of this invention is to provide a system and method for generating such an image with user-controlled means for varying the location and orientation of the viewpoint corresponding to the image. [0019] [0020] Another object of this invention is to provide a system and method for generating such an image with user-controlled means for varying the opacity of structures and surfaces in the viewed region of interest, so that the displayed image shows structures and features that would be otherwise hidden in a normal view. [0020] [0021] Yet another object of this invention is to provide a system and method for generating such an image with a viewpoint located at the tip of the instrument being used by the surgeon, looking along the longitudinal axis of the instrument. [0021] [0022] Still another object of this invention is to provide a system and method for generating such an image in real time, such that the displayed image continually corresponds to the position of the instrument being used by the surgeon. [0022] [0023] Yet a further object of this invention is to provide a system and method for comparing and combining such an image with the image produced by an image-generating instrument being used by the surgeon. [0023] [0024] These and other objects, advantages, characteristics and features of the invention may be better understood by examining the following drawings together with the detailed description of the preferred embodiments. [0024] BRIEF DESCRIPTION OF THE DRAWINGS [0025] FIG. 1 is a schematic perspective drawing of the apparatus of the present invention in operating room use during the performance of neurosurgical procedures. [0025] [0026] FIG. 2 is a schematic block diagram of the computer system and optical tracking system of the present invention. [0026] [0027] FIG. 3 is a schematic block diagram of the navigation protocol using pre-operative data that is followed in carrying out the method of the present invention. [0027] [0028] FIG. 4 is a schematic block diagram of the navigation protocol using ultrasound intra-operative data that is followed in carrying out the method of the present invention. [0028] [0029] FIG. 
5 is a schematic block diagram of the endoscopic protocol that is followed in carrying out the method of the present invention. [0029] [0030] FIG. 6 is a schematic flow chart of the pre-operative computer program that implements the pre-operative protocol of the present invention. [0030] [0031] FIG. 7 is a schematic flow chart of the intra-operative ultrasound computer program that implements the ultrasound protocol of the present invention. [0031] [0032] FIG. 8 is a schematic flow chart of the intra-operative endoscope computer program that implements the endoscope protocol of the present invention. [0032] [0033] FIG. 9 is a drawing of a display generated according to the present invention, showing axial, coronal, and sagittal views of a head, together with a three-dimensional perspective view of the head taken from an exterior viewpoint. [0033] [0034] FIG. 10 is a drawing of a display generated according to the present invention, showing sectional axial, coronal, and sagittal views of a head, together with a three-dimensional perspective view of the head taken from an interior viewpoint. [0034] [0035] FIG. 11a is a drawing of a plastic model of a human skull and a surgical probe that has been used to demonstrate the present invention. [0036] FIG. 11b is another drawing of the model skull of FIG. 11a, with the top of the skull removed to show model internal structures for demonstration purposes. [0037] FIG. 12 is a simplified reproduction of two displays produced by the present invention for the model skull shown in FIGS. 11a, 11b. [0038] FIG. 13 is a simplified reproduction of two further displays of the invention for the skull in FIGS. 11a, 11b. [0039] FIG. 14 is a reproduction of a composite display produced by the present invention for an actual human head. [0039] DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS [0040] FIG. 1 shows the apparatus of the invention as used in performing or planning a neurosurgery operation. In this drawing the patient's head [0040] 112 has a tumor or lesion 117, which is the target object of the operation. Fiducial markers 113, 114 are attached to the head to enable registration of images generated by previously obtained scan data according to techniques familiar to persons of ordinary skill in the relevant art. A surgical probe or instrument 109 held by the surgeon is directed toward the tissues of interest. A computer 101 is connected to user input devices including a keyboard 103 and mouse 104, and a video display device 102 which is preferably a color monitor. The display device 102 is located such that it can be easily viewed by the surgeon during an operation, and the user input devices 103 and 104 are placed within easy reach to facilitate use during the surgery. The apparatus further includes a position tracking system, which is preferably an optical tracking system (hereafter “OTS”) having a sensing unit 105 mounted overhead in view of the operating table scene, and at least two light emitting diodes (LED's) 110, 111 mounted on the surgical instrument 109. These LED's preferably emit continuous streams of pulsed infrared signals which are sensed by a plurality of infrared detectors 106, 107, 108 mounted in the sensing unit 105 in view of the surgical instrument 109. 
The instrument 109 and the sensing unit 105 are both connected to the computer 101, which controls the timing and synchronization of the pulse emissions by the LED's and the recording and processing of the infrared signals received by the detectors 106-108. The OTS further includes software for processing these signals to generate data indicating the location and orientation of the instrument 109. The OTS generates the position detecting data on a real time continuous basis, so that as the surgical instrument 109 is moved, its position and orientation are continually tracked and recorded by the sensing unit 105 in the computer 101. The OTS may preferably be of the type known as the “FlashPoint 3-D Optical Localizer”, which is commercially available from Image Guided Technologies of Boulder, Colo., similar to the systems described in U.S. Pat. Nos. 5,617,857 (Chader, et al.) and 5,622,170 (Schulz) discussed previously. However, the invention is not limited to this particular OTS, and other position tracking systems, such as sonic position detecting systems, may also be utilized. [0041] As illustrated in FIG. 1, the surgical instrument [0041] 109 is elongated in shape, having a longitudinal axis and tip 115 pointing toward the tissues of interest. The instrument may be an endoscope having a conical field of view 116 that is indicated by dotted lines in FIG. 1. The instrument shown in the Figure is held at a position external to the patient's head. If an incision 118 has been made into the skull, the instrument may be inserted through the incision; this alternative position is shown by dotted lines in FIG. 1. In both positions the instrument is held so that there is an unobstructed line of sight between the LED's 110, 111 and the sensing unit 105. In endoscopic and other optical viewing applications, the instrument may include a laser targeting system (not shown in the drawings) to illuminate and highlight the region under examination. [0042] FIG. 2 shows a schematic block diagram of the computer system connected to the position tracking system. The computer [0042] 101 includes a central processing unit (CPU) 201 communicative with a memory 202, the video display 102, keyboard and mouse 103, 104, optical detectors 106-108, and the LED's mounted on the surgical instrument 109. The computer memory contains software means for operating and controlling the position tracking system. In an alternative preferred embodiment, the OTS components 105-109 may be connected to and controlled by a separate computer or controller which is connected to the computer 101 and provides continual data indicating the position and orientation of the surgical instrument 109. [0043] The above apparatus is operated to carry out surgical protocols that are illustrated schematically in FIGS. [0043] 3-5. FIG. 3 is a schematic block diagram of the protocol for handling pre-operative data (“pre-op protocol”) to generate images during surgery according to the present invention. It is assumed that three-dimensional image data of the patient's head have been previously obtained from one or more of the techniques that are known to persons of ordinary skill in the medical imaging arts. Preferably these data are acquired from CT, MIR and/or MRI scan techniques to provide images with improved accuracy and detail, compared to ultrasound scan data for example. The scan data are loaded and stored 301 into the computer memory 202 through additional input means such as disk drives or tape drives, not shown in the drawings. 
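One way the OTS software could derive the instrument's location and orientation from the two tracked LED positions is sketched below; this is an illustration rather than the patent's own algorithm, and it assumes both LED's lie on the instrument's longitudinal axis and that the offset from the front LED to the tip 115 is known from calibration (the names probe_view_vector and tip_offset are hypothetical):

    import numpy as np

    def probe_view_vector(led_rear, led_front, tip_offset):
        """Estimate the tip position and look direction from two tracked markers."""
        a = np.asarray(led_rear, float)    # 3-D position of the rear LED
        b = np.asarray(led_front, float)   # 3-D position of the LED nearer the tip
        axis = b - a
        axis /= np.linalg.norm(axis)       # unit vector along the longitudinal axis
        tip = b + tip_offset * axis        # the tip lies further along the same axis
        return tip, axis                   # viewpoint and viewing direction

The returned position and direction play the role of the view vector that the protocols below obtain by querying the OTS.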
[0044] The patient data is registered [0044] 302 according to one of the generally known techniques. This procedure may be either a three-dimensional registration of the entire data set, or a slice-by-slice sequence of two-dimensional registrations. Following the three-dimensional registration, the image is reconstructed 303 in memory, using volumetric or surface rendering to produce an array of 3-dimensional voxel data. Segmentation 304 is then carried out on these data to distinguish various anatomical features, such as different types of material in the head (bone, brain tissue, vascular and ventricular structures, etc.) and the location of surfaces, using one or more known segmentation techniques. Preferably the segmentation process includes assigning different display colors to different types of structures to facilitate their identification and distinction in a color video display. For example, the vascular system may be displayed in red, the ventricular system may be shown in blue, bones may be colored brown, and so on. In a preferred embodiment these assignments may be varied by the user by means of the keyboard 103 or mouse 104. Also in a preferred embodiment the display opacities may be varied by the user by means of the keyboard 103, mouse 104, or other input device (such as a voice-activated device) to further facilitate the identification and distinction of hidden or obstructed features in the video display. In an alternative protocol in which 2-dimensional registration is carried out, segmentation 309 can be done for each 2-dimensional image sample, and the 3-dimensional data are then reconstructed 310 from the segmented data slices. This alternative protocol is shown by dotted lines in the Figure. [0045] Referring still to FIG. 3, the next phase of the pre-op protocol is to determine the location and orientation of the view vector [0045] 305 to define the image to be displayed. This view vector is obtained by querying the OTS to ascertain the current location and orientation of the surgical instrument 109. With this information, the three-dimensional scan data is then manipulated 306 to position and orient the resulting three-dimensional perspective view and to define cutting planes and reference markers in the displayed image indicating and clarifying this view. The manipulated three-dimensional perspective image is then displayed 307 on the video display 102. In addition, other two-dimensional images, such as 2D sectional views for any cutting planes, are preferably also displayed along with the 3D perspective display for purposes of elucidation. [0046] Finally, the pre-op protocol is a continuing loop process in which the OTS is repeatedly queried [0046] 308 for changes in the location of the view vector corresponding to changes in the position and orientation of the surgical instrument 109. Thus the displayed images are continually being updated during the surgical procedure, and the resulting displays are constantly refreshed in real time. The image data are also stored or buffered and made available for further use 311 according to subsequent protocols. [0047] The surgical instrument [0047] 109 may include an ultrasound transducer located at the tip 115, which itself scans and detects ultrasound imaging data when placed in contact with the patient's head. FIG. 4 is a schematic block diagram showing the intra-operative (“intra-op”) ultrasound (“US”) protocol for handling the US image data during surgery. 
Typically the ultrasound transducer is a phased focusing array which generates data from a planar fan-shaped sector of the anatomical region of interest, where the central axis of the transducer lies in the plane of the scan sector and, in this context, is collinear with the longitudinal axis of the surgical instrument 109. By rotating the instrument and transducer about this axis, US scan data is collected and stored 401 for a cone-shaped volume in the region of interest. This cone defines the “field of view” of the transducer scan. [0048] The location and orientation of the transducer are tracked and determined [0048] 402 by the OTS, and the US data is used to reconstruct 403 three-dimensional intra-op image data for the region of interest. This data is manipulated 404 in a way analogous to the manipulation 306 of the pre-op data, and then used to generate three-dimensional images 405, together with any desired corresponding two-dimensional images of the ultrasound data. These intra-op images are fused 406 with the pre-op images generated by the pre-op protocol 311, and the composite images are further displayed. Finally, the OTS is continually strobed 407, and the ultrasound images are constantly refreshed. [0049] FIG. 5 is a schematic block diagram of the intra-op protocol in which an endoscope is placed at the tip [0049] 115 of the surgical instrument 109. This protocol is also applicable for procedures utilizing a surgical microscope in place of the endoscope. Image data is acquired 501, using a CCD camera or other known technique, representing a 2-dimensional image in a plane orthogonal to the line of sight of the endoscope or microscope, which in this context is the longitudinal axis of the surgical instrument 109. The location and orientation of the instrument are tracked and determined 502 by the OTS, and analog-to-digital (“A/D”) conversion 503 is carried out on the data. The location of the viewpoint is determined 504 from the OTS data, and the endoscope or microscope image data is manipulated 505 to generate the desired image 506 for display. These intra-op images are fused 508 with the pre-op images generated by the pre-op protocol 311, and the composite images are further displayed. Finally, the OTS is continually strobed 507, and the endoscope images are constantly refreshed. [0050] The foregoing protocols are implemented by program modules stored in the memory [0050] 202 of the computer 101. FIG. 6 is a schematic block diagram of a flow chart for a program that implements the pre-op protocol. The program starts 601 by causing the computer to receive and load 602 previously obtained scan data for the patient, such as MRI or CT data. The computer further reads data from the OTS 603 to register the scanned patient data 604. For 3D volumetric rendering, the scanned data is used to reconstruct image data 605 in three dimensions, and segmentation 606 is carried out on this reconstruction. In an alternative embodiment, shown by dotted lines in the Figure, segmentation is carried out on 2D slices 615, and these segmented slices are then reconstructed into the full 3D image data. [0051] The program next reads input data from the keyboard [0051] 103 or mouse 104 to enable the user to select a field of view for image displays 607. The image data is then manipulated and transformed 608 to generate the requested view, along with any selected reference markers, material opacities, colors, and other options presented to the user by the program. 
In addition, the user may request a 3D display of the entire head, together with a superimposed cone showing the field of view for an endoscope, microscope, ultrasound transducer, or other viewing device being used during the surgery. The resulting manipulated image is then displayed 609 preferably in color on the video display 102. The computer next reads the OTS data 610 and determines 611 whether the surgical instrument has moved. If so, program control returns to the selection of a new field of view 607 and the successive operations 608-610 shown in FIG. 6. If the position of the instrument has not changed, the displayed image is stored 612, refreshing any previously stored display image. The program further looks for requests from the user 613 whether to discontinue operation, and if there are no such requests, the operations 611 and 612 are repeated. Thus the computer remains in a loop of operations until the user requests termination 614. [0052] FIG. 7 is a schematic block diagram of a flow chart for a program that implements the ultrasound intra-op protocol. The program starts [0052] 701 by causing the computer to receive and load the data from a US transducer at the tip 115 of the surgical instrument 109. Such data is produced normally using polar or spherical coordinates to specify locations in the region of interest, and the program converts 703 this data preferably to Cartesian coordinates. Next, OTS data is read 704 to determine the position and orientation of the surgical instrument 109, and US data from the aggregation of aligned data slices is utilized to reconstruct 3D image data 705 representing the US scan data. This image data is manipulated and transformed 706 by the program in a manner similar to the manipulation 608 of the pre-op data, and the resulting image is displayed 707. [0053] Similarly to the pre-op program shown in FIG. 6, the OTS is queried [0053] 709 to determine whether the surgical instrument has moved 713, and if so a new US display image is constructed. In a preferred embodiment, the program queries the user 716 whether to carry out another US scan of the region of interest. If so, program control returns to the operation 702 in FIG. 7 and US data is obtained by the US transducer. If another scan is not requested 716, the program returns to operation 705 and a new 3D image is reconstructed from the present US scan data. [0054] If the OTS query [0054] 709 determines that the surgical instrument has not moved since the last query, the US image is fused 710 with the pre-op image obtained by the program shown in FIG. 6, and the combined image is displayed 711. The OTS is again queried 712 to determine 713 whether the surgical instrument has moved. If so, the program returns to the new scan user query 716. Otherwise the program further looks for requests from the user 714 whether to discontinue operation, and if there are no such requests, the operation 713 is repeated. Thus the computer remains in a loop of operations until the user requests termination 715, similarly to the pre-op program of FIG. 6. [0055] The endoscope/microscope intra-op protocol is implemented preferably by the endoscope intra-op program having a flow chart shown in schematic block diagram form in FIG. 8. Upon starting [0055] 801, the program causes the computer to receive and load image data from the endoscope 802. This data is digitized 803 and preferably displayed 804 on the video display 102. 
The OTS is queried 805 [0056] to receive information determining the location and orientation of the endoscope [0056] 806. Using this information, the pre-op data obtained by the pre-op program illustrated in FIG. 6 is retrieved 807, and utilized to reconstruct a 3-dimensional virtual image 808 from the viewpoint of the endoscope. This image is displayed 809, in a manner similar to the 3D display of images by the pre-op program illustrated in FIG. 6. This image is fused 810 with the endoscope image displayed in operation 804, and the combined image is also displayed 811. The OTS is then strobed 812 to determine 813 whether the endoscope has moved since the last query, and if so, program control returns to the operation 802 which refreshes the image data received by the endoscope. Otherwise the program further looks for requests from the user 814 whether to discontinue operation, and if there are no such requests, the operation 813 is repeated. Thus the computer remains in a loop of operations until the user requests termination 815, similarly to the pre-op and intra-op programs of FIGS. 6 and 7. [0057] The foregoing program modules may be designed independently, and they can be configured also to run independently. Thus, the pre-op program may be completed, followed by running of either or both of the intra-op programs. Preferably, however, these programs operate in parallel during surgery so that the pre-op data images and intra-op data images are all continually refreshed as the operation proceeds. Known methods for parallel execution of programs may be utilized to accomplish this result. [0057] [0058] The above programs are carried out preferably on a computer [0058] 101 that is adapted for computer graphics applications. Suitable computers for these programs are commercially available from Silicon Graphics, Inc. of Mountain View, Calif. Graphics software modules for most of the individual image processing operations in the above programs are also available from Silicon Graphics, Inc. as well as other sources. [0059] Referring now to FIG. 9, the drawing shows a highly simplified sketch of a three-dimensional image display [0059] 901 obtained by the above system with the surgical probe 109 of FIG. 1 in the position illustrated, pointing toward the target lesion or tumor 117 inside the patient's head 112. The display 901 is a perspective view from the tip 115 of the probe 109. This display is continuously refreshed, so that as the probe 109 is moved the displayed image 901 immediately changes. It will be noted that, although the probe 109 is shown entirely outside the patient's head, the display 901 shows internal anatomical structures such as the brain and the target lesion 117. With the present system, the display characteristics can be adjusted in real time to emphasize or de-emphasize the internal structures. These structures may be distinguished by displays with different colors for different types of material. Also, the display opacity of the skin, skull, and brain tissue may be reduced to reveal or emphasize further structural details regarding the target lesion 117. In short, the display 901 effectively equips the surgeon with “X-ray eyes” to look at hidden structures through obstructing surfaces and objects. With this display, the entire internal structure of the head may be examined and studied to plan a surgical trajectory before any incision is made. 
Furthermore, if the surgical instrument 109 is a scalpel, the display 901 allows the surgeon to see any structures immediately behind a surface prior to the first incision. FIG. 9 shows also the conventional axial 902, coronal 903 and sagittal 904 2D displays for purposes of further clarification and elucidation of the region under examination. [0060] When the surgical instrument [0060] 109 is an endoscope or US transducer, the field of view 116 is also indicated in the display 901 by the quasi-circular image 905 indicating the intersection of the conical field of view 116 with the surface of the skin viewed by the endoscope 109. This conical field of view is also superimposed, for completeness, in the 2D displays 902-904. In a preferred embodiment, displays are also presented showing the actual image seen by the endoscope in the field of view 905, and the 3D perspective image for the same region in the field of view 905; these auxiliary displays are not shown in the drawings. Similar auxiliary displays are preferably included when the instrument 109 is an ultrasound transducer. [0061] After an incision [0061] 118 has been made in the patient's head, the endoscope may be inserted to provide an internal view of the target anatomy. Referring now to FIG. 10, the drawing shows a highly simplified sketch of a three-dimensional image display 1001 obtained by the above system with the endoscope 109 of FIG. 1 in the alternative position shown by the dotted lines, pointing toward the target lesion or tumor 117. The display 1001 has been manipulated to provide a three-dimensional sectional view with a cutting plane passing through the tip 115 of the endoscope 109 and orthogonal to its axis. Again, the endoscope field of view 905 is indicated in the display, and in a preferred embodiment auxiliary displays are also presented showing the actual image seen by the endoscope in the field of view 905, and the 3D perspective image for the same region in the field of view 905; these auxiliary displays are also not shown in FIG. 10. This Figure preferably also includes the conventional axial 1002, coronal 1003 and sagittal 1004 2D displays for purposes of further clarification and elucidation. [0062] FIGS. 11a, 11b, 12 and 13 illustrate further the three-dimensional displays that are produced by a preferred embodiment of the present invention. Referring to FIGS. 11a, 11b, a plastic model of a skull has been fabricated having a base portion 1102 and a removable top portion 1101. These Figures show the model skull 1101, 1102 resting on a stand 1106. FIG. 11a also shows a pointer 1104 with LED's 1101 connected to an OTS (not shown in the drawing) that has been used to generate displays according to the invention. A plurality of holes 1103 in the top portion 1101 are provided, which allow the pointer 1104 to be extended into the interior of the skull. FIG. 11b shows the skull with the top portion 1101 removed. A plastic model of internal structures 1107 is fabricated inside the skull; these internal structures are easily recognizable geometric solids, as illustrated in the Figure. [0063] The skull of FIGS. 11a, 11b has been scanned to generate “pre-op” image data, which has been utilized to produce the displays shown in FIGS. 12, 13. FIG. 12 is a composite of two displays 1201, 1202 of the skull with the pointer 1104 directed toward the skull from a top center external location, similar to the location and orientation of the pointer shown in FIG. 1. 
The display 1201 is a three-dimensional perspective view from this pointer location. The display 1202 is the same view, but with the display opacity of the skull material reduced. This reduced opacity makes the internal structure 1107 clearly visible, as shown in the Figure. During actual use, the system enables the surgeon to vary this opacity in real time to adjust the image so that both the skull structure and the internal structure are visible in the display in various proportions. [0064] It will be noted that the surface contour lines shown in the display [0064] 1201 are produced by the finite size of the rendering layers or voxels. These contour lines may be reduced by smoothing the data, or by reducing the sizes of the voxels or layers. [0065] FIG. 13 is a composite of two further displays with the pointer [0065] 1104 moved to extend through one of the openings 1103. Display 1302 is the view from the tip of the pointer inside the skull. Display 1301 is a view of the entire structure from outside the skull along the pointer axis; in other words, display 1302 is substantially a magnification of part of display 1301. Display 1301 shows the skull with a portion cut away by a cutting plane through the tip of the pointer, perpendicular to the pointer axis. Both of these displays clearly illustrate the perspective nature of the three-dimensional displays generated by the present invention. [0066] Finally, FIG. 14 is a simplified composite of displays generated by the system for an actual human head. Display [0066] 1401 is a perspective view of the entire head with a cutaway portion defined by orthogonal cutting planes as shown. This display also shows the field of view of an endoscope pointing toward the head along the intersection line of the two cutting planes, with the tip of the endoscope at the apex of the cone. Display 1402 shows the two-dimensional sectional view produced by the vertical cutting plane, and display 1403 shows the corresponding sectional view produced by the horizontal cutting plane. Furthermore, the images in displays 1402 and 1403 are also transformed (rotated and magnified) and superimposed on the three-dimensional image in display 1401. [0067] Both of these displays indicate also the intersection of the cutting planes with the conical field of view. Display [0067] 1404 is the actual image seen by the endoscope. Display 1405 is a virtual perspective view of the endoscope image reconstructed from scan data by volume rendering in accordance with the present invention. Display 1406 is a virtual perspective view of the image from the endoscope viewpoint with a narrower field of view, reconstructed from scan data by surface rendering in accordance with the present invention. This display 1406 would be used with a surgical probe in planning a surgical trajectory. Display 1407 is a magnification of 1406 (i.e. with a narrower field of view) showing the virtual image that would be seen through a microscope. Finally, display 1408 is a segmented three-dimensional perspective view of the entire head from the scan data utilizing surface rendering, and display 1409 is the same view with volume rendering. FIG. 14 illustrates the rich variety and versatility of the displays that are possible with the present system. All of these displays are presented to the surgeon in real time, simultaneously, and can be varied on line. 
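The adjustable-opacity behavior shown in displays 1202 and 1408-1409 can be pictured as front-to-back compositing along each viewing ray through the segmented volume. The sketch below is one simple way to realize that idea and is not taken from the patent; it assumes the volume has already been segmented into labels with user-assigned colors and opacities, and the names composite_ray, color_table, and opacity_table are illustrative:

    import numpy as np

    def composite_ray(labels_along_ray, color_table, opacity_table):
        """Front-to-back compositing of one viewing ray through a segmented volume."""
        color = np.zeros(3)
        transmitted = 1.0                          # fraction of light not yet absorbed
        for label in labels_along_ray:             # samples ordered from the probe tip outward
            alpha = opacity_table.get(label, 0.0)  # user-adjustable opacity per tissue type
            rgb = np.asarray(color_table.get(label, (0.0, 0.0, 0.0)), float)
            color += transmitted * alpha * rgb
            transmitted *= (1.0 - alpha)
            if transmitted < 1e-3:                 # stop early once the ray is effectively opaque
                break
        return color

Setting a tissue's opacity near zero makes it effectively transparent, which is what lets the display reveal structures lying behind skin or bone.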
[0068] It is apparent from the foregoing description that this invention provides improved means for navigating through the anatomy during actual surgical procedures. The system enables the surgeon to select and adjust the display with the same tool that is being utilized to perform the procedure, without requiring extra manual operations. Since the displays are provided immediately in real time, the imaging does not require any interruption of the procedure. In addition, the virtual images provided by this system are continuously correlated with the images that are obtained through conventional means. [0068] [0069] It will be further appreciated by persons of ordinary skill in the art that the invention is not limited in its application to neurosurgery, or to any other kind of surgery or medical diagnostic application. For example, systems embodying the invention can be used for actual nautical or aviation navigation, utilizing information from satellites to obtain the “pre-op” scan data. The pointing device can be implemented by the vessel or aircraft itself, and the video display could be replaced by special imaging goggles or helmets. [0069] [0070] The foregoing description of the preferred embodiments of the invention has been presented solely for purposes of illustration and description, and is neither exhaustive nor limited to the precise forms disclosed. Many modifications and variations are possible in light of the above teaching. The spirit and scope of the invention are to be defined by reference to the following claims, along with their full scope of equivalents. [0070]
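The continuous correlation of virtual and actual images noted above corresponds to the fusion operations of the protocols (406, 508, 710, 810). In its simplest form this can be an alpha blend of the rendered virtual view with the registered instrument image from the same viewpoint; the following is a minimal sketch under that assumption, with the function fuse_images and the blend parameter being illustrative rather than part of the patent:

    import numpy as np

    def fuse_images(actual, virtual, blend=0.5):
        """Blend a registered instrument image with the virtual view rendered at the same viewpoint."""
        actual = np.asarray(actual, float)                # e.g. an endoscope video frame, H x W x 3
        virtual = np.asarray(virtual, float)              # rendering from the scan data, same size
        return (1.0 - blend) * actual + blend * virtual   # blend = 0 shows only the actual image, 1 only the virtual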
Claims (16):
1. A method for generating an image of a three-dimensional object, said method comprising the steps of: acquiring volumetric first scan data for the object; utilizing said first scan data to reconstruct first virtual image data representing structural information in said first scan data; selecting a viewpoint for displaying an image of said object based on said first virtual image data; manipulating said first virtual image data to generate a first three-dimensional perspective image of said object from said viewpoint; and displaying said first three-dimensional perspective image.
2. The method recited in claim 1, wherein the step of utilizing said first scan data to reconstruct first virtual image data representing structural information in said first scan data includes the step of segmenting said first virtual image data to distinguish selected features of said object.
3. The method recited in claim 1, wherein the step of utilizing said first scan data to reconstruct first virtual image data representing structural information in said first scan data includes the step of registration of said first virtual image data in relation to said object to determine the location of features of said object represented in said first virtual image data.
4. The method recited in claim 1, further comprising, following the step of displaying said first three-dimensional perspective image, repeating any desired number of times the steps of: selecting another viewpoint for displaying an image of said object based on said first virtual image data; manipulating said first virtual image data to generate a first three-dimensional perspective image of said object from said other viewpoint; and displaying said first three-dimensional perspective image.
5. The method recited in claim 1, further comprising the steps of: acquiring volumetric second scan data for the object; utilizing said second scan data to reconstruct second virtual image data representing structural information in said second scan data; determining the viewpoint for displaying an image of said object based on said second virtual image data to coincide with said viewpoint selected for displaying an image of said object based on said virtual image data; manipulating said second virtual image data to generate a second three-dimensional perspective image of said object from said viewpoint; and displaying said second three-dimensional perspective image.
6. The method recited in claim 5, further comprising the step of fusing said second three-dimensional perspective image and said first three-dimensional perspective image to display a combined image.
7. The method recited in claim 1, further comprising the steps of: acquiring second scan data for the object; utilizing said second scan data to reconstruct second virtual image data representing structural information in said second scan data; determining the viewpoint for displaying an image of said object based on said second virtual image data to coincide with said viewpoint selected for displaying an image of said object based on said virtual image data; manipulating said second virtual image data to generate a second image of said object from said viewpoint; and displaying said second image. 
8. The method recited in claim 7, further comprising the step of fusing said second image and said first three-dimensional perspective image to display a combined image.
9. Apparatus for generating an image of a three-dimensional object, comprising: a computer having a memory; display means communicative with said computer; input means communicative with said computer; pointer means communicative with said computer, said pointer means being movable by the user; and position tracking means communicative with said computer and said pointer means, such that said position tracking means detects the position and orientation of said pointer means continually and communicates said position and orientation to said computer; wherein said computer memory contains volumetric first scan data for the object, and further contains a program which causes said computer to perform the steps of: utilizing said first scan data to reconstruct first virtual image data representing structural information in said first scan data; determining a viewpoint for displaying an image of said object based on said first virtual image data to be the position and orientation of said pointer means detected by said position tracking means; manipulating said first virtual image data to generate a first three-dimensional perspective image of said object from said viewpoint; and displaying said first three-dimensional perspective image.
10. Apparatus as recited in claim 9, wherein the step of utilizing said first scan data to reconstruct first virtual image data representing structural information in said first scan data includes the step of segmenting said first virtual image data to distinguish selected features of said object.
11. Apparatus as recited in claim 9, wherein the step of utilizing said first scan data to reconstruct first virtual image data representing structural information in said first scan data includes the step of registration of said first virtual image data in relation to said object to determine the location of features of said object represented in said first virtual image data.
12. Apparatus as recited in claim 9, wherein said program causes said computer, following the step of displaying said first three-dimensional perspective image, to perform and repeat any desired number of times the further steps of: selecting another viewpoint for displaying an image of said object based on said first virtual image data; manipulating said first virtual image data to generate a first three-dimensional perspective image of said object from said other viewpoint; and displaying said first three-dimensional perspective image.
13. 
Apparatus as recited in claim 9, wherein said program performs the further steps of: acquiring volumetric second scan data for the object; utilizing said second scan data to reconstruct second virtual image data representing structural information in said second scan data; determining the viewpoint for displaying an image of said object based on said second virtual image data to coincide with said viewpoint selected for displaying an image of said object based on said virtual image data; manipulating said second virtual image data to generate a second three-dimensional perspective image of said object from said viewpoint; and displaying said second three-dimensional perspective image.
14. Apparatus as recited in claim 13, wherein said program performs the further step of fusing said second three-dimensional perspective image and said first three-dimensional perspective image to display a combined image.
15. Apparatus as recited in claim 9, wherein said program performs the further steps of: acquiring second scan data for the object; utilizing said second scan data to reconstruct second virtual image data representing structural information in said second scan data; determining the viewpoint for displaying an image of said object based on said second virtual image data to coincide with said viewpoint selected for displaying an image of said object based on said virtual image data; manipulating said second virtual image data to generate a second image of said object from said viewpoint; and displaying said second image.
16. Apparatus as recited in claim 15, wherein said program performs the further step of fusing said second image and said first three-dimensional perspective image to display a combined image.
Patent family (publication number, publication date):
US20010029333A1 (2001-10-11)
US7844320B2 (2010-11-30)
US6167296A (2000-12-26)
US8116848B2 (2012-02-14)
US6591130B2 (2003-07-08)
US20010016684A1 (2001-08-23)
US20110040175A1 (2011-02-17)
US20030032878A1 (2003-02-13)
US6529758B2 (2003-03-04)
position of a surgical object| US11102381B1|2021-01-05|2021-08-24|Board Of Regents, The University Of Texas System Clearcam Inc.|Methods, systems and controllers for facilitating cleaning of an imaging element of an imaging device|US30397A||1860-10-16||Window-blind fastener | USRE30397E|1976-04-27|1980-09-09||Three-dimensional ultrasonic imaging of animal soft tissue| US4583538A|1984-05-04|1986-04-22|Onik Gary M|Method and apparatus for stereotaxic placement of probes in the body utilizing CT scanner localization| US5078140A|1986-05-08|1992-01-07|Kwoh Yik S|Imaging device - aided robotic stereotaxis system| US4770182A|1986-11-26|1988-09-13|Fonar Corporation|NMR screening method| US4945478A|1987-11-06|1990-07-31|Center For Innovative Technology|Noninvasive medical imaging system and method for the identification and 3-D display of atherosclerosis and the like| US4977505A|1988-05-24|1990-12-11|Arch Development Corporation|Means to correlate images from scans taken at different times including means to determine the minimum distances between a patient anatomical contour and a correlating surface| GB8828342D0|1988-12-05|1989-01-05|Rediffusion Simulation Ltd|Image generator| EP0647428A3|1989-11-08|1995-07-12|George S Allen|Interactive image-guided surgical system.| US5222499A|1989-11-15|1993-06-29|Allen George S|Method and apparatus for imaging the anatomy| US5070401A|1990-04-09|1991-12-03|Welch Allyn, Inc.|Video measurement system with automatic calibration and distortion correction| JP3112025B2|1990-10-26|2000-11-27|株式会社日立製作所|Biological measurement device| US6006126A|1991-01-28|1999-12-21|Cosman; Eric R.|System and method for stereotactic registration of image scan data| US5313306A|1991-05-13|1994-05-17|Telerobotics International, Inc.|Omniview motionless camera endoscopy system| US5261404A|1991-07-08|1993-11-16|Mick Peter R|Three-dimensional mammal anatomy imaging system and method| US5608849A|1991-08-27|1997-03-04|King, Jr.; Donald|Method of visual guidance for positioning images or data in three-dimensional space| US5299253A|1992-04-10|1994-03-29|Akzo N.V.|Alignment system to overlay abdominal computer aided tomography and magnetic resonance anatomy with single photon emission tomography| US5603318A|1992-04-21|1997-02-18|University Of Utah Research Foundation|Apparatus and method for photogrammetric surgical localization| US5389101A|1992-04-21|1995-02-14|University Of Utah|Apparatus and method for photogrammetric surgical localization| US5417210A|1992-05-27|1995-05-23|International Business Machines Corporation|System and method for augmentation of endoscopic surgery| AT399647B|1992-07-31|1995-06-26|Truppe Michael|ARRANGEMENT FOR DISPLAYING THE INTERIOR OF BODIES| US5337732A|1992-09-16|1994-08-16|Cedars-Sinai Medical Center|Robotic endoscopy| US5585813A|1992-10-05|1996-12-17|Rockwell International Corporation|All aspect head aiming display| US5842473A|1993-11-29|1998-12-01|Life Imaging Systems|Three-dimensional imaging system| CA2110148C|1992-12-24|1999-10-05|Aaron Fenster|Three-dimensional ultrasound imaging system| US5528735A|1993-03-23|1996-06-18|Silicon Graphics Inc.|Method and apparatus for displaying data within a three-dimensional information landscape| JPH07508449A|1993-04-20|1995-09-21||| EP0700269B1|1993-04-22|2002-12-11|Image Guided Technologies, Inc.|System for locating relative positions of objects| DE9422172U1|1993-04-26|1998-08-06|Univ St Louis|Specify the location of a surgical probe| US5391199A|1993-07-20|1995-02-21|Biosense, Inc.|Apparatus and method for treating cardiac 
arrhythmias| US5540229A|1993-09-29|1996-07-30|U.S. Philips Cororation|System and method for viewing three-dimensional echographic data| US5558091A|1993-10-06|1996-09-24|Biosense, Inc.|Magnetic determination of position and orientation| US5815126A|1993-10-22|1998-09-29|Kopin Corporation|Monocular portable communication and display system| US5454371A|1993-11-29|1995-10-03|London Health Association|Method and system for constructing and displaying three-dimensional images| US5491510A|1993-12-03|1996-02-13|Texas Instruments Incorporated|System and method for simultaneously viewing a scene and an obscured object| US5458126A|1994-02-24|1995-10-17|General Electric Company|Cardiac functional analysis system employing gradient image segmentation| JP3503982B2|1994-03-18|2004-03-08|富士通株式会社|Viewpoint setting device| US5531520A|1994-09-01|1996-07-02|Massachusetts Institute Of Technology|System and method of registration of three-dimensional data sets including anatomical body data| EP0951874A3|1994-09-15|2000-06-14|Visualization Technology, Inc.|Position tracking and imaging system for use in medical applications using a reference unit secured to a patients head| US5611025A|1994-11-23|1997-03-11|General Electric Company|Virtual internal cavity inspection system| US5546807A|1994-12-02|1996-08-20|Oxaal; John T.|High speed volumetric ultrasound imaging system| JP3539645B2|1995-02-16|2004-07-07|株式会社日立製作所|Remote surgery support device| US5868673A|1995-03-28|1999-02-09|Sonometrics Corporation|System for carrying out surgery, biopsy and ablation of a tumor or other physical anomaly| US5797849A|1995-03-28|1998-08-25|Sonometrics Corporation|Method for carrying out a medical procedure using a three-dimensional tracking and imaging system| US5882206A|1995-03-29|1999-03-16|Gillio; Robert G.|Virtual surgery system| US5833627A|1995-04-13|1998-11-10|United States Surgical Corporation|Image-guided biopsy apparatus and methods of use| US5887121A|1995-04-21|1999-03-23|International Business Machines Corporation|Method of constrained Cartesian control of robotic mechanisms with active and passive joints| US5892538A|1995-06-30|1999-04-06|Ericsson Inc.|True three-dimensional imaging and display system| US5776050A|1995-07-24|1998-07-07|Medical Media Systems|Anatomical visualization system| US5772594A|1995-10-17|1998-06-30|Barrick; Earl F.|Fluoroscopic image guided orthopaedic surgery system with intraoperative registration| US5682886A|1995-12-26|1997-11-04|Musculographics Inc|Computer-assisted surgical system| US5781195A|1996-04-16|1998-07-14|Microsoft Corporation|Method and system for rendering two-dimensional views of a three-dimensional surface| US6167296A|1996-06-28|2000-12-26|The Board Of Trustees Of The Leland Stanford Junior University|Method for volumetric image navigation| US6016439A|1996-10-15|2000-01-18|Biosense, Inc.|Method and apparatus for synthetic viewpoint imaging|FR2652928B1|1989-10-05|1994-07-29|Diadix Sa|INTERACTIVE LOCAL INTERVENTION SYSTEM WITHIN A AREA OF A NON-HOMOGENEOUS STRUCTURE.| US7635390B1|2000-01-14|2009-12-22|Marctec, Llc|Joint replacement component having a modular articulating surface| JP3432825B2|1992-08-14|2003-08-04|ブリテイッシュ・テレコミュニケーションズ・パブリック・リミテッド・カンパニー|Positioning system| US6256529B1|1995-07-26|2001-07-03|Burdette Medical Systems, Inc.|Virtual reality 3D visualization for surgical procedures| US6167296A|1996-06-28|2000-12-26|The Board Of Trustees Of The Leland Stanford Junior University|Method for volumetric image navigation| US6346940B1|1997-02-27|2002-02-12|Kabushiki Kaisha 
Toshiba|Virtualized endoscope system| US6434507B1|1997-09-05|2002-08-13|Surgical Navigation Technologies, Inc.|Medical instrument and method for use with computer-assisted image guided surgery| US6226548B1|1997-09-24|2001-05-01|Surgical Navigation Technologies, Inc.|Percutaneous registration apparatus and method for use in computer-assisted surgical navigation| US6021343A|1997-11-20|2000-02-01|Surgical Navigation Technologies|Image guided awl/tap/screwdriver| US6348058B1|1997-12-12|2002-02-19|Surgical Navigation Technologies, Inc.|Image guided spinal surgery guide, system, and method for use thereof| US6327490B1|1998-02-27|2001-12-04|Varian Medical Systems, Inc.|Brachytherapy system for prostate cancer treatment with computer implemented systems and processes to facilitate pre-implantation planning and post-implantation evaluations with storage of multiple plan variations for a single patient| US6360116B1|1998-02-27|2002-03-19|Varian Medical Systems, Inc.|Brachytherapy system for prostate cancer treatment with computer implemented systems and processes to facilitate pre-operative planning and post-operative evaluations| US6477400B1|1998-08-20|2002-11-05|Sofamor Danek Holdings, Inc.|Fluoroscopic image guided orthopaedic surgery system with intraoperative registration| JP4223596B2|1998-09-16|2009-02-12|Hoya株式会社|Electronic endoscope system| US8944070B2|1999-04-07|2015-02-03|Intuitive Surgical Operations, Inc.|Non-force reflecting method for providing tool force information to a user of a telesurgical system| JP4342016B2|1999-01-06|2009-10-14|株式会社日立メディコ|Image display device| US6556695B1|1999-02-05|2003-04-29|Mayo Foundation For Medical Education And Research|Method for producing high resolution real-time images, of structure and function during medical procedures| US6470207B1|1999-03-23|2002-10-22|Surgical Navigation Technologies, Inc.|Navigational guidance via computer-assisted fluoroscopic imaging| US6491699B1|1999-04-20|2002-12-10|Surgical Navigation Technologies, Inc.|Instrument guidance method and system for image guided surgery| US9572519B2|1999-05-18|2017-02-21|Mediguide Ltd.|Method and apparatus for invasive device tracking using organ timing signal generated from MPS sensors| US7778688B2|1999-05-18|2010-08-17|MediGuide, Ltd.|System and method for delivering a stent to a selected position within a lumen| US9833167B2|1999-05-18|2017-12-05|Mediguide Ltd.|Method and system for superimposing virtual anatomical landmarks on an image| JP4421016B2|1999-07-01|2010-02-24|東芝医用システムエンジニアリング株式会社|Medical image processing device| WO2001019252A1|1999-09-14|2001-03-22|Hitachi Medical Corporation|Biological light measuring instrument| US6474341B1|1999-10-28|2002-11-05|Surgical Navigation Technologies, Inc.|Surgical communication and power system| US6381485B1|1999-10-28|2002-04-30|Surgical Navigation Technologies, Inc.|Registration of human anatomy integrated for electromagnetic localization| US6499488B1|1999-10-28|2002-12-31|Winchester Development Associates|Surgical sensor| US8644907B2|1999-10-28|2014-02-04|Medtronic Navigaton, Inc.|Method and apparatus for surgical navigation| US6493573B1|1999-10-28|2002-12-10|Winchester Development Associates|Method and system for navigating a catheter probe in the presence of field-influencing objects| US6544178B1|1999-11-05|2003-04-08|Volumetrics Medical Imaging|Methods and systems for volume rendering using ultrasound data| US6671538B1|1999-11-26|2003-12-30|Koninklijke Philips Electronics, N.V.|Interface system for use with imaging devices to facilitate 
visualization of image-guided interventional procedure planning| AU4305201A|1999-11-29|2001-06-04|Board Of Trustees Of The Leland Stanford Junior University|Method and apparatus for transforming view orientations in image-guided surgery| WO2001041685A2|1999-12-10|2001-06-14|Iscience Corporation|Treatment of ocular disease| JP2001197485A|2000-01-11|2001-07-19|Asahi Optical Co Ltd|Electronic endoscope system and electronic endoscope signal switching device| US7708741B1|2001-08-28|2010-05-04|Marctec, Llc|Method of preparing bones for knee replacement surgery| WO2001062173A2|2000-02-25|2001-08-30|The Board Of Trustees Of The Leland Stanford Junior University|Methods and apparatuses for maintaining a trajectory in sterotaxi for tracking a target inside a body| WO2001064124A1|2000-03-01|2001-09-07|Surgical Navigation Technologies, Inc.|Multiple cannula image guided tool for image guided procedures| US6535756B1|2000-04-07|2003-03-18|Surgical Navigation Technologies, Inc.|Trajectory storage apparatus and method for surgical navigation system| US6889073B2|2000-05-08|2005-05-03|David A. Lampman|Breast biopsy and therapy system for magnetic resonance imagers| JP2004513673A|2000-05-09|2004-05-13|ペイエオン・インコーポレーテツド|System and method for three-dimensional reconstruction of arteries| US7085400B1|2000-06-14|2006-08-01|Surgical Navigation Technologies, Inc.|System and method for image based sensor calibration| US7555333B2|2000-06-19|2009-06-30|University Of Washington|Integrated optical scanning image acquisition and display| US6594516B1|2000-07-18|2003-07-15|Koninklijke Philips Electronics, N.V.|External patient contouring| JP4674948B2|2000-09-29|2011-04-20|オリンパス株式会社|Surgical navigation device and method of operating surgical navigation device| AT493070T|2000-10-18|2011-01-15|Paieon Inc|SYSTEM FOR POSITIONING A DEVICE IN A TUBULAR ORGAN| US6917827B2|2000-11-17|2005-07-12|Ge Medical Systems Global Technology Company, Llc|Enhanced graphic features for computer assisted surgery system| US6718194B2|2000-11-17|2004-04-06|Ge Medical Systems Global Technology Company, Llc|Computer assisted intramedullary rod surgery system with enhanced features| US6650928B1|2000-11-27|2003-11-18|Ge Medical Systems Global Technology Company, Llc|Color parametric and composite maps for CT perfusion| US20020149628A1|2000-12-22|2002-10-17|Smith Jeffrey C.|Positioning an item in three dimensions via a graphical representation| DE10105592A1|2001-02-06|2002-08-08|Achim Goepferich|Placeholder for drug release in the frontal sinus| CA2334495A1|2001-02-06|2002-08-06|Surgical Navigation Specialists, Inc.|Computer-aided positioning method and system| US6923817B2|2001-02-27|2005-08-02|Smith & Nephew, Inc.|Total knee arthroplasty systems and processes| US7046831B2|2001-03-09|2006-05-16|Tomotherapy Incorporated|System and method for fusion-aligned reprojection of incomplete data| US20040082863A1|2002-03-15|2004-04-29|Mcgreevy James|Device and method for the photodynamic diagnosis of tumor tissue| US7003175B2|2001-03-28|2006-02-21|Siemens Corporate Research, Inc.|Object-order multi-planar reformatting| WO2002085212A2|2001-04-10|2002-10-31|Koninklijke Philips Electronics N.V.|A fluoroscopy intervention method with a cone-beam| US7526112B2|2001-04-30|2009-04-28|Chase Medical, L.P.|System and method for facilitating cardiac intervention| US7327862B2|2001-04-30|2008-02-05|Chase Medical, L.P.|System and method for facilitating cardiac intervention| WO2002093495A1|2001-05-11|2002-11-21|Koninklijke Philips Electronics N.V.|Method, system and computer 
program for producing a medical report| US6636757B1|2001-06-04|2003-10-21|Surgical Navigation Technologies, Inc.|Method and apparatus for electromagnetic navigation of a surgical probe near a metal object| US7853312B2|2001-06-07|2010-12-14|Varian Medical Systems, Inc.|Seed localization system for use in an ultrasound system and method of using the same| US6549802B2|2001-06-07|2003-04-15|Varian Medical Systems, Inc.|Seed localization system and method in ultrasound by fluoroscopy and ultrasound fusion| EP1395195A1|2001-06-13|2004-03-10|Volume Interactions Pte. Ltd.|A guide system and a probe therefor| US6990220B2|2001-06-14|2006-01-24|Igo Technologies Inc.|Apparatuses and methods for surgical navigation| DE10136709B4|2001-07-27|2004-09-02|Siemens Ag|Device for performing surgical interventions and method for displaying image information during such an intervention on a patient| ITMI20011635A1|2001-07-27|2003-01-27|G D S Giorgi Dynamic Stereotax|DEVICE AND PROCESS OF MICROSURGERY ASSISTED BY THE PROCESSOR| AU2002324775A1|2001-08-23|2003-03-10|Sciperio, Inc.|Architecture tool and methods of use| US7438685B2|2001-11-05|2008-10-21|Computerized Medical Systems, Inc.|Apparatus and method for registration, guidance and targeting of external beam radiation therapy| JP4032410B2|2001-11-09|2008-01-16|ソニー株式会社|Information processing system, information processing method, program, recording medium, and information processing apparatus| CA2466811A1|2001-11-21|2003-06-05|Viatronix Incorporated|Imaging system and method for cardiac analysis| US20030152897A1|2001-12-20|2003-08-14|Bernhard Geiger|Automatic navigation for virtual endoscopy| US6741883B2|2002-02-28|2004-05-25|Houston Stereotactic Concepts, Inc.|Audible feedback from positional guidance systems| US6947786B2|2002-02-28|2005-09-20|Surgical Navigation Technologies, Inc.|Method and apparatus for perspective inversion| EP1340470B1|2002-03-01|2004-09-15|BrainLAB AG|Operation theatre lighting device with camera system for three-dimensional referencing| US6990368B2|2002-04-04|2006-01-24|Surgical Navigation Technologies, Inc.|Method and apparatus for virtual digital subtraction angiography| US7998062B2|2004-03-29|2011-08-16|Superdimension, Ltd.|Endoscope structures and techniques for navigating to a target in branched structure| AU2003219393A1|2002-05-03|2003-11-17|Koninklijke Philips Electronics N.V.|Method of producing and displaying an image of a 3 dimensional volume| EP1514243A1|2002-06-19|2005-03-16|Siemens Aktiengesellschaft|Cross-platform and data-specific visualisation of 3d data records| WO2004000151A1|2002-06-25|2003-12-31|Michael Nicholas Dalton|Apparatus and method for superimposing images over an object| DE10232676B4|2002-07-18|2006-01-19|Siemens Ag|Method and device for positioning a patient in a medical diagnostic or therapeutic device| US8275091B2|2002-07-23|2012-09-25|Rapiscan Systems, Inc.|Compact mobile cargo scanning system| US7963695B2|2002-07-23|2011-06-21|Rapiscan Systems, Inc.|Rotatable boom cargo scanning system| US7630752B2|2002-08-06|2009-12-08|Stereotaxis, Inc.|Remote control of medical devices using a virtual device interface| US8317816B2|2002-09-30|2012-11-27|Acclarent, Inc.|Balloon catheters and methods for treating paranasal sinuses| ES2204322B1|2002-10-01|2005-07-16|Consejo Sup. De Invest. 
Cientificas|FUNCTIONAL BROWSER.| US7289599B2|2002-10-04|2007-10-30|Varian Medical Systems Technologies, Inc.|Radiation process and apparatus| US7869861B2|2002-10-25|2011-01-11|Howmedica Leibinger Inc.|Flexible tracking article and method of using the same| DE10252837B4|2002-11-13|2005-03-24|Carl Zeiss|Examination system and examination procedure| US7758508B1|2002-11-15|2010-07-20|Koninklijke Philips Electronics, N.V.|Ultrasound-imaging systems and methods for a user-guided three-dimensional volume-scan sequence| US7599730B2|2002-11-19|2009-10-06|Medtronic Navigation, Inc.|Navigation system for cardiac therapies| US7697972B2|2002-11-19|2010-04-13|Medtronic Navigation, Inc.|Navigation system for cardiac therapies| US7319897B2|2002-12-02|2008-01-15|Aesculap Ag & Co. Kg|Localization device display method and apparatus| US6991605B2|2002-12-18|2006-01-31|Siemens Medical Solutions Usa, Inc.|Three-dimensional pictograms for use with medical images| US7693563B2|2003-01-30|2010-04-06|Chase Medical, LLP|Method for image processing and contour assessment of the heart| US20070014452A1|2003-12-01|2007-01-18|Mitta Suresh|Method and system for image processing and assessment of a state of a heart| US20050043609A1|2003-01-30|2005-02-24|Gregory Murphy|System and method for facilitating cardiac intervention| US7542791B2|2003-01-30|2009-06-02|Medtronic Navigation, Inc.|Method and apparatus for preplanning a surgical procedure| US7660623B2|2003-01-30|2010-02-09|Medtronic Navigation, Inc.|Six degree of freedom alignment display for medical procedures| US7559890B2|2003-02-26|2009-07-14|Ikona Medical Corporation|Endoscopic imaging of an organ system| US20100262000A1|2003-02-26|2010-10-14|Wallace Jeffrey M|Methods and devices for endoscopic imaging| US7744528B2|2003-02-26|2010-06-29|Infinite Biomedical Technologies, Llc|Methods and devices for endoscopic imaging| WO2004075755A1|2003-02-28|2004-09-10|Matsushita Electric Industrial Co., Ltd.|Ultrasonographic display device| US7333644B2|2003-03-11|2008-02-19|Siemens Medical Solutions Usa, Inc.|Systems and methods for providing automatic 3D lesion segmentation and measurements| US7304644B2|2003-03-12|2007-12-04|Siemens Medical Solutions Usa, Inc.|System and method for performing a virtual endoscopy| US7620220B2|2003-03-21|2009-11-17|Boston Scientific Scimed, Inc.|Scan conversion of medical imaging data from polar format to cartesian format| JP2007512854A|2003-04-28|2007-05-24|ブラッコ イメージング ソチエタ ペル アチオニ|Surgical navigation system | JP4300488B2|2003-05-08|2009-07-22|株式会社日立メディコ|Reference image display method and ultrasonic diagnostic apparatus in ultrasonic diagnosis| DE10323217A1|2003-05-22|2004-12-16|Siemens Ag|Optical coherent tomography system of examination of tissues or organs, has position sensor at tip of catheter and reconstructs volume image based on sectional images and associated position data| US7194120B2|2003-05-29|2007-03-20|Board Of Regents, The University Of Texas System|Methods and systems for image-guided placement of implants| US20050033117A1|2003-06-02|2005-02-10|Olympus Corporation|Object observation system and method of controlling object observation system| US20050065424A1|2003-06-06|2005-03-24|Ge Medical Systems Information Technologies, Inc.|Method and system for volumemetric navigation supporting radiological reading in medical imaging systems| US6928141B2|2003-06-20|2005-08-09|Rapiscan, Inc.|Relocatable X-ray imaging system and method for inspecting commercial vehicles and cargo containers| US20050010105A1|2003-07-01|2005-01-13|Sra Jasbir 
S.|Method and system for Coronary arterial intervention| US20050004580A1|2003-07-01|2005-01-06|Tommi Jokiniemi|System for pointing a lesion in an X-rayed object| US20050015004A1|2003-07-17|2005-01-20|Hertel Sarah Rose|Systems and methods for combining an anatomic structure and metabolic activity for an object| WO2005008583A2|2003-07-21|2005-01-27|Paieon Inc.|Method and system for identifying an optimal image within a series of images that depict a moving organ| US8403828B2|2003-07-21|2013-03-26|Vanderbilt University|Ophthalmic orbital surgery apparatus and method and image-guide navigation system| CN1846235A|2003-07-22|2006-10-11|Sgdl系统有限公司|Acquisition method and apparatus for generating m-degree forms in a n-dimension space| US7343030B2|2003-08-05|2008-03-11|Imquant, Inc.|Dynamic tumor treatment system| US8055323B2|2003-08-05|2011-11-08|Imquant, Inc.|Stereotactic system and method for defining a tumor treatment region| EP1653877A1|2003-08-07|2006-05-10|Xoran Technologies, Inc.|Intra-operative ct scanner| US7398116B2|2003-08-11|2008-07-08|Veran Medical Technologies, Inc.|Methods, apparatuses, and systems useful in conducting image guided interventions| US8150495B2|2003-08-11|2012-04-03|Veran Medical Technologies, Inc.|Bodily sealants and methods and apparatus for image-guided delivery of same| US20050159676A1|2003-08-13|2005-07-21|Taylor James D.|Targeted biopsy delivery system| US7313430B2|2003-08-28|2007-12-25|Medtronic Navigation, Inc.|Method and apparatus for performing stereotactic surgery| DE10339979B4|2003-08-29|2005-11-17|Tomtec Imaging Systems Gmbh|Method and device for displaying a predeterminable area in multi-dimensional data sets| US8000771B2|2003-09-02|2011-08-16|Cardiac Pacemakers, Inc.|Method and apparatus for catheterization by detecting signals indicating proximity to anatomical features| US20050054895A1|2003-09-09|2005-03-10|Hoeg Hans David|Method for using variable direction of view endoscopy in conjunction with image guided surgical systems| DE602004022432D1|2003-09-15|2009-09-17|Super Dimension Ltd|SYSTEM FROM ACCESSORIES FOR USE WITH BRONCHOSCOPES| EP2316328B1|2003-09-15|2012-05-09|Super Dimension Ltd.|Wrap-around holding device for use with bronchoscopes| US8276091B2|2003-09-16|2012-09-25|Ram Consulting|Haptic response system and method of use| US20050059887A1|2003-09-16|2005-03-17|Hassan Mostafavi|Localization of a target using in vivo markers| US20050059879A1|2003-09-16|2005-03-17|Robert Sutherland|Localization of a sensor device in a body| US7742629B2|2003-09-25|2010-06-22|Paieon Inc.|System and method for three-dimensional reconstruction of a tubular organ| US20070038035A1|2003-10-01|2007-02-15|W.E.C.U. 
Technologies Ltd.|Method and system for screening and indicating individuals with hidden intent| US7862570B2|2003-10-03|2011-01-04|Smith & Nephew, Inc.|Surgical positioners| US7835778B2|2003-10-16|2010-11-16|Medtronic Navigation, Inc.|Method and apparatus for surgical navigation of a multiple piece construct for implantation| US7840253B2|2003-10-17|2010-11-23|Medtronic Navigation, Inc.|Method and apparatus for surgical navigation| US7366562B2|2003-10-17|2008-04-29|Medtronic Navigation, Inc.|Method and apparatus for surgical navigation| US8239001B2|2003-10-17|2012-08-07|Medtronic Navigation, Inc.|Method and apparatus for surgical navigation| US7764985B2|2003-10-20|2010-07-27|Smith & Nephew, Inc.|Surgical navigation system component fault interfaces and related processes| US20050085718A1|2003-10-21|2005-04-21|Ramin Shahidi|Systems and methods for intraoperative targetting| EP1689290A2|2003-10-21|2006-08-16|The Board of Trustees of The Leland Stanford Junior University|Systems and methods for intraoperative targeting| US20050085717A1|2003-10-21|2005-04-21|Ramin Shahidi|Systems and methods for intraoperative targetting| US20050113680A1|2003-10-29|2005-05-26|Yoshihiro Ikeda|Cerebral ischemia diagnosis assisting apparatus, X-ray computer tomography apparatus, and apparatus for aiding diagnosis and treatment of acute cerebral infarct| ES2362491T3|2003-11-14|2011-07-06|SMITH & NEPHEW, INC.|ADJUSTABLE SURGICAL CUTTING SYSTEMS.| US7232409B2|2003-11-20|2007-06-19|Karl Storz Development Corp.|Method and apparatus for displaying endoscopic images| US20070276214A1|2003-11-26|2007-11-29|Dachille Frank C|Systems and Methods for Automated Segmentation, Visualization and Analysis of Medical Images| EP1691666B1|2003-12-12|2012-05-30|University of Washington|Catheterscope 3d guidance and interface system| DE102004004620A1|2004-01-29|2005-08-25|Siemens Ag|Medical x-ray imaging method for recording an examination area for use in medical navigational procedures, whereby a spatial position of an examination area is recorded just prior to each shot and images then spatially compensated| US7333643B2|2004-01-30|2008-02-19|Chase Medical, L.P.|System and method for facilitating cardiac intervention| US20060036162A1|2004-02-02|2006-02-16|Ramin Shahidi|Method and apparatus for guiding a medical instrument to a subsurface target site in a patient| US8764725B2|2004-02-09|2014-07-01|Covidien Lp|Directional anchoring mechanism, method and applications thereof| US7580178B2|2004-02-13|2009-08-25|Angstrom, Inc.|Image-guided microsurgery system and method| US7668285B2|2004-02-16|2010-02-23|Kabushiki Kaisha Toshiba|X-ray computed tomographic apparatus and image processing apparatus| WO2005079492A2|2004-02-17|2005-09-01|Traxtal Technologies Inc.|Method and apparatus for registration, verification, and referencing of internal organs| CN1960680B|2004-02-20|2010-09-08|赫克托·O·帕切科|Method for determining size and placement of pedicle screw in spinal surgery| US9615772B2|2004-02-20|2017-04-11|Karl Storz Imaging, Inc.|Global endoscopic viewing indicator| WO2005084571A1|2004-03-03|2005-09-15|Deutsches Krebsforschungszentrum Stiftung des öffentlichen Rechts|Incremental real time recording of tracked instruments in tubular organ structures inside the human body| DE102004010544A1|2004-03-04|2005-09-22|Daimlerchrysler Ag|Safety device for a motor vehicle| US20050197558A1|2004-03-04|2005-09-08|Williams James P.|System and method for performing a virtual endoscopy in a branching structure| JP4630564B2|2004-03-30|2011-02-09|国立大学法人浜松医科大学|Surgery 
support apparatus, method and program| JP4493383B2|2004-04-01|2010-06-30|オリンパス株式会社|Procedure support system| US9033871B2|2004-04-07|2015-05-19|Karl Storz Imaging, Inc.|Gravity referenced endoscopic image orientation| US7361168B2|2004-04-21|2008-04-22|Acclarent, Inc.|Implantable device and methods for delivering drugs and other substances to treat sinusitis and other disorders| US7419497B2|2004-04-21|2008-09-02|Acclarent, Inc.|Methods for treating ethmoid disease| US7410480B2|2004-04-21|2008-08-12|Acclarent, Inc.|Devices and methods for delivering therapeutic substances for the treatment of sinusitis and other disorders| US7803150B2|2004-04-21|2010-09-28|Acclarent, Inc.|Devices, systems and methods useable for treating sinusitis| US9399121B2|2004-04-21|2016-07-26|Acclarent, Inc.|Systems and methods for transnasal dilation of passageways in the ear, nose or throat| US7654997B2|2004-04-21|2010-02-02|Acclarent, Inc.|Devices, systems and methods for diagnosing and treating sinusitus and other disorders of the ears, nose and/or throat| US8951225B2|2005-06-10|2015-02-10|Acclarent, Inc.|Catheters with non-removable guide members useable for treatment of sinusitis| US8146400B2|2004-04-21|2012-04-03|Acclarent, Inc.|Endoscopic methods and devices for transnasal procedures| US20060063973A1|2004-04-21|2006-03-23|Acclarent, Inc.|Methods and apparatus for treating disorders of the ear, nose and throat| US8932276B1|2004-04-21|2015-01-13|Acclarent, Inc.|Shapeable guide catheters and related methods| US9101384B2|2004-04-21|2015-08-11|Acclarent, Inc.|Devices, systems and methods for diagnosing and treating sinusitis and other disorders of the ears, Nose and/or throat| US10188413B1|2004-04-21|2019-01-29|Acclarent, Inc.|Deflectable guide catheters and related methods| US8764729B2|2004-04-21|2014-07-01|Acclarent, Inc.|Frontal sinus spacer| US9351750B2|2004-04-21|2016-05-31|Acclarent, Inc.|Devices and methods for treating maxillary sinus disease| US8864787B2|2004-04-21|2014-10-21|Acclarent, Inc.|Ethmoidotomy system and implantable spacer devices having therapeutic substance delivery capability for treatment of paranasal sinusitis| EP1737375B1|2004-04-21|2021-08-11|Smith & Nephew, Inc|Computer-aided navigation systems for shoulder arthroplasty| US9089258B2|2004-04-21|2015-07-28|Acclarent, Inc.|Endoscopic methods and devices for transnasal procedures| US8414473B2|2004-04-21|2013-04-09|Acclarent, Inc.|Methods and apparatus for treating disorders of the ear nose and throat| US8747389B2|2004-04-21|2014-06-10|Acclarent, Inc.|Systems for treating disorders of the ear, nose and throat| US20070208252A1|2004-04-21|2007-09-06|Acclarent, Inc.|Systems and methods for performing image guided procedures within the ear, nose, throat and paranasal sinuses| US7462175B2|2004-04-21|2008-12-09|Acclarent, Inc.|Devices, systems and methods for treating disorders of the ear, nose and throat| US20070167682A1|2004-04-21|2007-07-19|Acclarent, Inc.|Endoscopic methods and devices for transnasal procedures| US8894614B2|2004-04-21|2014-11-25|Acclarent, Inc.|Devices, systems and methods useable for treating frontal sinusitis| US20060004323A1|2004-04-21|2006-01-05|Exploramed Nc1, Inc.|Apparatus and methods for dilating and modifying ostia of paranasal sinuses and other intranasal or paranasal structures| US7720521B2|2004-04-21|2010-05-18|Acclarent, Inc.|Methods and devices for performing procedures within the ear, nose, throat and paranasal sinuses| US8702626B1|2004-04-21|2014-04-22|Acclarent, Inc.|Guidewires for performing image guided 
procedures| US9554691B2|2004-04-21|2017-01-31|Acclarent, Inc.|Endoscopic methods and devices for transnasal procedures| US7567834B2|2004-05-03|2009-07-28|Medtronic Navigation, Inc.|Method and apparatus for implantation between two vertebral bodies| US20050283070A1|2004-06-21|2005-12-22|Celina Imielinska|Systems and methods for qualifying symmetry to evaluate medical images| US20050285853A1|2004-06-29|2005-12-29|Ge Medical Systems Information Technologies, Inc.|3D display system and method| US7855727B2|2004-09-15|2010-12-21|Gyrus Acmi, Inc.|Endoscopy device supporting multiple input devices| JP4213100B2|2004-09-17|2009-01-21|富士通株式会社|Data transfer system and data transfer method| US20060074285A1|2004-09-24|2006-04-06|Paieon Inc.|Apparatus and method for fusion and in-operating-room presentation of volumetric data and 3-D angiographic data| US7831294B2|2004-10-07|2010-11-09|Stereotaxis, Inc.|System and method of surgical imagining with anatomical overlay for navigation of surgical devices| WO2006043238A1|2004-10-22|2006-04-27|Koninklijke Philips Electronics N.V.|Real time stereoscopic imaging apparatus and method| CA2586560A1|2004-11-05|2006-06-01|The Government Of The United States Of America, As Represented By The Se Cretary, Department Of Health And Human Services|Access system| US7811224B2|2004-11-09|2010-10-12|Karl Storz Development Corp.|Method for dealing with singularities in gravity referenced endoscopic imaging| US7751868B2|2004-11-12|2010-07-06|Philips Electronics Ltd|Integrated skin-mounted multifunction device for use in image-guided surgery| US7805269B2|2004-11-12|2010-09-28|Philips Electronics Ltd|Device and method for ensuring the accuracy of a tracking device in a volume| DE102004058122A1|2004-12-02|2006-07-13|Siemens Ag|Medical image registration aid for landmarks by computerized and photon emission tomographies, comprises permeable radioactive substance is filled with the emission tomography as radiation permeable containers, a belt and patient body bowl| GB2421572A|2004-12-22|2006-06-28|Elekta Ab|Radiotherapeutic apparatus| US20060142740A1|2004-12-29|2006-06-29|Sherman Jason T|Method and apparatus for performing a voice-assisted orthopaedic surgical procedure| US7840254B2|2005-01-18|2010-11-23|Philips Electronics Ltd|Electromagnetically tracked K-wire device| WO2006078678A2|2005-01-18|2006-07-27|Traxtal Inc.|Method and apparatus for guiding an instrument to a target in the lung| DE602006021274D1|2005-02-09|2011-05-26|Hitachi Medical Corp|ULTRASONIC EQUIPMENT AND ULTRASONIC PROCEDURE| US7967742B2|2005-02-14|2011-06-28|Karl Storz Imaging, Inc.|Method for using variable direction of view endoscopy in conjunction with image guided surgical systems| JP4860636B2|2005-02-17|2012-01-25|コーニンクレッカフィリップスエレクトロニクスエヌヴィ|Auto 3D display| US8177788B2|2005-02-22|2012-05-15|Smith & Nephew, Inc.|In-line milling system| US20080160489A1|2005-02-23|2008-07-03|Koninklijke Philips Electronics, N.V.|Method For the Prediction of the Course of a Catheter| US7530948B2|2005-02-28|2009-05-12|University Of Washington|Tethered capsule endoscope for Barrett's Esophagus screening| CN101528122B|2005-03-07|2011-09-07|赫克托·O·帕切科|Drivepipe used for inserting into vertebral pedicle| US8295577B2|2005-03-31|2012-10-23|Michael Zarkh|Method and apparatus for guiding a device in a totally occluded or partly occluded tubular organ| JP2008534109A|2005-03-31|2008-08-28|パイエオン インコーポレイテッド|Apparatus and method for positioning a device within a tubular organ| US7471764B2|2005-04-15|2008-12-30|Rapiscan Security Products, 
Inc.|X-ray imaging system having improved weather resistance| US10555775B2|2005-05-16|2020-02-11|Intuitive Surgical Operations, Inc.|Methods and system for performing 3-D tool tracking by fusion of sensor and/or camera derived data during minimally invasive robotic surgery| CA2613360A1|2005-06-21|2007-01-04|Traxtal Inc.|System, method and apparatus for navigated therapy and diagnosis| WO2008045016A2|2005-06-21|2008-04-17|Traxtal Inc.|Device and method for a trackable ultrasound| DE102005029242B4|2005-06-23|2012-10-25|Siemens Ag|Method for recording and evaluating image data of an examination object and associated device| DE102005029243A1|2005-06-23|2007-01-04|Siemens Ag|Method for displaying and processing at least one examination image of an examination object| WO2007011306A2|2005-07-20|2007-01-25|Bracco Imaging S.P.A.|A method of and apparatus for mapping a virtual model of an object to the object| DE102005034683A1|2005-07-25|2007-02-15|Siemens Ag|Method for generating computed tomographic images during an intervention| US7787699B2|2005-08-17|2010-08-31|General Electric Company|Real-time integration and recording of surgical image data| DE102005039657A1|2005-08-22|2007-03-22|Siemens Ag|Medical instrument e.g. catheter, representation method for x-ray diagnostic device, involves superimposing actual position of medical instrument in image of three-dimensional data record| US20070053486A1|2005-08-23|2007-03-08|Zelnik Deborah R|Methods and apparatus for nuclear tomo-cardiology scanning| WO2007025081A2|2005-08-24|2007-03-01|Traxtal Inc.|System, method and devices for navigated flexible endoscopy| US20070066881A1|2005-09-13|2007-03-22|Edwards Jerome R|Apparatus and method for image guided accuracy verification| US7835784B2|2005-09-21|2010-11-16|Medtronic Navigation, Inc.|Method and apparatus for positioning a reference frame| US8114113B2|2005-09-23|2012-02-14|Acclarent, Inc.|Multi-conduit balloon catheter| US7912258B2|2005-09-27|2011-03-22|Vanderbilt University|Method and apparatus for standardizing ultrasonography training using image to physical space registration of tomographic volumes from tracked ultrasound| EP1772745B1|2005-10-06|2008-08-27|MedCom Gesellschaft für medizinische Bildverarbeitung mbH|Registering 2D ultrasound image data and 3D image data of an object| US9141254B2|2005-11-12|2015-09-22|Orthosensor Inc|Navigation system and user interface for directing a control action| DE102005055664B4|2005-11-22|2014-08-14|Siemens Aktiengesellschaft|Method for determining ordinal numbers of spatial elements assigned to spatial points| US20070116328A1|2005-11-23|2007-05-24|Sezai Sablak|Nudity mask for use in displaying video camera images| WO2007067163A1|2005-11-23|2007-06-14|University Of Washington|Scanning beam with variable sequential framing using interrupted scanning resonance| US8303505B2|2005-12-02|2012-11-06|Abbott Cardiovascular Systems Inc.|Methods and apparatuses for image guided medical procedures| US8401264B2|2005-12-08|2013-03-19|University Of Washington|Solid modeling based on volumetric scans| US20070152874A1|2005-12-30|2007-07-05|Woodington Walter G|Reducing undesirable coupling of signal between two or more signal paths in a radar system| US9168102B2|2006-01-18|2015-10-27|Medtronic Navigation, Inc.|Method and apparatus for providing a container to a sterile environment| US8219178B2|2007-02-16|2012-07-10|Catholic Healthcare West|Method and system for performing invasive medical procedures using a surgical robot| US10624710B2|2012-06-21|2020-04-21|Globus Medical, 
Inc.|System and method for measuring depth of instrumentation| US10842461B2|2012-06-21|2020-11-24|Globus Medical, Inc.|Systems and methods of checking registrations for surgical systems| US10799298B2|2012-06-21|2020-10-13|Globus Medical Inc.|Robotic fluoroscopic navigation| US10874466B2|2012-06-21|2020-12-29|Globus Medical, Inc.|System and method for surgical tool insertion using multiaxis force and moment feedback| US10136954B2|2012-06-21|2018-11-27|Globus Medical, Inc.|Surgical tool systems and method| US10758315B2|2012-06-21|2020-09-01|Globus Medical Inc.|Method and system for improving 2D-3D registration convergence| US10893912B2|2006-02-16|2021-01-19|Globus Medical Inc.|Surgical tool systems and methods| US11045267B2|2012-06-21|2021-06-29|Globus Medical, Inc.|Surgical robotic automation with tracking markers| US10231791B2|2012-06-21|2019-03-19|Globus Medical, Inc.|Infrared signal based position recognition system for use with a robot-assisted surgery| US10646280B2|2012-06-21|2020-05-12|Globus Medical, Inc.|System and method for surgical tool insertion using multiaxis force and moment feedback| US10653497B2|2006-02-16|2020-05-19|Globus Medical, Inc.|Surgical tool systems and methods| IL181470A|2006-02-24|2012-04-30|Visionsense Ltd|Method and system for navigating within a flexible organ of the body of a patient| WO2007100303A1|2006-03-01|2007-09-07|Agency For Science, Technology & Research|A method and system for obtaining multiple views of an object for real-time video output| US20070297560A1|2006-03-03|2007-12-27|Telesecurity Sciences, Inc.|Method and system for electronic unpacking of baggage and cargo| JP2009528128A|2006-03-03|2009-08-06|ユニヴァーシティオブワシントン|Multi-clad optical fiber scanner| US20070236514A1|2006-03-29|2007-10-11|Bracco Imaging Spa|Methods and Apparatuses for Stereoscopic Image Guided Surgical Navigation| US8112292B2|2006-04-21|2012-02-07|Medtronic Navigation, Inc.|Method and apparatus for optimizing a therapy| WO2007131561A2|2006-05-16|2007-11-22|Surgiceye Gmbh|Method and device for 3d acquisition, 3d visualization and computer guided surgery using nuclear probes| US8190389B2|2006-05-17|2012-05-29|Acclarent, Inc.|Adapter for attaching electromagnetic image guidance components to a medical device| EP1857070A1|2006-05-18|2007-11-21|BrainLAB AG|Contactless medical registration with distance measurement| WO2007136745A2|2006-05-19|2007-11-29|University Of Hawaii|Motion tracking system for real time adaptive imaging and spectroscopy| US8040519B2|2006-05-23|2011-10-18|Hitachi Medical Corporation|Biological optical measurement apparatus| US9469034B2|2007-06-13|2016-10-18|Intuitive Surgical Operations, Inc.|Method and system for switching modes of a robotic system| US9138129B2|2007-06-13|2015-09-22|Intuitive Surgical Operations, Inc.|Method and system for moving a plurality of articulated instruments in tandem back towards an entry guide| US7729752B2|2006-06-13|2010-06-01|Rhythmia Medical, Inc.|Non-contact cardiac mapping, including resolution map| US20090192523A1|2006-06-29|2009-07-30|Intuitive Surgical, Inc.|Synthetic representation of a surgical instrument| US8620473B2|2007-06-13|2013-12-31|Intuitive Surgical Operations, Inc.|Medical robotic system with coupled control modes| US7515954B2|2006-06-13|2009-04-07|Rhythmia Medical, Inc.|Non-contact cardiac mapping, including moving catheter and multi-beat integration| US7505810B2|2006-06-13|2009-03-17|Rhythmia Medical, Inc.|Non-contact cardiac mapping, including preprocessing| KR101477125B1|2006-06-13|2014-12-29|인튜어티브 서지컬 
인코포레이티드|Minimally invasive surgical system| US8560047B2|2006-06-16|2013-10-15|Board Of Regents Of The University Of Nebraska|Method and apparatus for computer aided surgery| CN101478913B|2006-06-28|2010-12-01|赫克托·O·帕切科|Apparatus and methods for templating and placement of artificial discs| US10008017B2|2006-06-29|2018-06-26|Intuitive Surgical Operations, Inc.|Rendering tool information as graphic overlays on displayed images of tools| US9789608B2|2006-06-29|2017-10-17|Intuitive Surgical Operations, Inc.|Synthetic representation of a surgical robot| US9718190B2|2006-06-29|2017-08-01|Intuitive Surgical Operations, Inc.|Tool position and identification indicator displayed in a boundary area of a computer display screen| GB0613576D0|2006-07-10|2006-08-16|Leuven K U Res & Dev|Endoscopic vision system| CA2658510C|2006-07-21|2013-01-15|Orthosoft Inc.|Non-invasive tracking of bones for surgery| US20080058629A1|2006-08-21|2008-03-06|University Of Washington|Optical fiber scope with both non-resonant illumination and resonant collection/imaging for multiple modes of operation| US8160677B2|2006-09-08|2012-04-17|Medtronic, Inc.|Method for identification of anatomical landmarks| US8160676B2|2006-09-08|2012-04-17|Medtronic, Inc.|Method for planning a surgical procedure| US8150498B2|2006-09-08|2012-04-03|Medtronic, Inc.|System for identification of anatomical landmarks| EP2063802B1|2006-09-08|2019-03-13|Medtronic, Inc.|System for navigating a planned procedure within a body| US8150497B2|2006-09-08|2012-04-03|Medtronic, Inc.|System for navigating a planned procedure within a body| US9820688B2|2006-09-15|2017-11-21|Acclarent, Inc.|Sinus illumination lightwire device| US7559925B2|2006-09-15|2009-07-14|Acclarent Inc.|Methods and devices for facilitating visualization in a surgical environment| US7824328B2|2006-09-18|2010-11-02|Stryker Corporation|Method and apparatus for tracking a surgical instrument during surgery| US20080071141A1|2006-09-18|2008-03-20|Abhisuek Gattani|Method and apparatus for measuring attributes of an anatomical feature during a medical procedure| US7945310B2|2006-09-18|2011-05-17|Stryker Corporation|Surgical instrument path computation and display for endoluminal surgery| US8248413B2|2006-09-18|2012-08-21|Stryker Corporation|Visual navigation system for endoscopic surgery| US8248414B2|2006-09-18|2012-08-21|Stryker Corporation|Multi-dimensional navigation of endoscopic video| US20080123910A1|2006-09-19|2008-05-29|Bracco Imaging Spa|Method and system for providing accuracy evaluation of image guided surgery| US20080086051A1|2006-09-20|2008-04-10|Ethicon Endo-Surgery, Inc.|System, storage medium for a computer program, and method for displaying medical images| US8660635B2|2006-09-29|2014-02-25|Medtronic, Inc.|Method and apparatus for optimizing a computer assisted surgical procedure| US8052598B2|2006-10-12|2011-11-08|General Electric Company|Systems and methods for calibrating an endoscope| US8401620B2|2006-10-16|2013-03-19|Perfint Healthcare Private Limited|Needle positioning apparatus and method| US7831096B2|2006-11-17|2010-11-09|General Electric Company|Medical navigation system with tool and/or implant integration into fluoroscopic image projections and method of use| US20080132834A1|2006-12-04|2008-06-05|University Of Washington|Flexible endoscope tip bending mechanism using optical fibers as tension members| US8439687B1|2006-12-29|2013-05-14|Acclarent, Inc.|Apparatus and method for simulated insertion and positioning of guidewares and other interventional devices| 
US8834372B2|2007-01-26|2014-09-16|Fujifilm Sonosite, Inc.|System and method for optimized spatio-temporal sampling| US20080190438A1|2007-02-08|2008-08-14|Doron Harlev|Impedance registration and catheter tracking| US10357184B2|2012-06-21|2019-07-23|Globus Medical, Inc.|Surgical tool systems and method| US10350013B2|2012-06-21|2019-07-16|Globus Medical, Inc.|Surgical tool systems and methods| US11116576B2|2012-06-21|2021-09-14|Globus Medical Inc.|Dynamic reference arrays and methods of use| WO2008103383A1|2007-02-20|2008-08-28|Gildenberg Philip L|Videotactic and audiotactic assisted surgical methods and procedures| US20080221388A1|2007-03-09|2008-09-11|University Of Washington|Side viewing optical fiber endoscope| US20080221434A1|2007-03-09|2008-09-11|Voegele James W|Displaying an internal image of a body lumen of a patient| US20080234544A1|2007-03-20|2008-09-25|Ethicon Endo-Sugery, Inc.|Displaying images interior and exterior to a body lumen of a patient| US8457718B2|2007-03-21|2013-06-04|Ethicon Endo-Surgery, Inc.|Recognizing a real world fiducial in a patient image data| US8081810B2|2007-03-22|2011-12-20|Ethicon Endo-Surgery, Inc.|Recognizing a real world fiducial in image data of a patient| US8840566B2|2007-04-02|2014-09-23|University Of Washington|Catheter with imaging capability acts as guidewire for cannula tools| US20080243030A1|2007-04-02|2008-10-02|University Of Washington|Multifunction cannula tools| US7978208B2|2007-04-16|2011-07-12|General Electric Company|Systems and methods for multi-source video distribution and composite display| JP5497630B2|2007-04-26|2014-05-21|コーニンクレッカフィリップスエヌヴェ|Risk indication for surgical procedures| US8118757B2|2007-04-30|2012-02-21|Acclarent, Inc.|Methods and devices for ostium measurement| US7952718B2|2007-05-03|2011-05-31|University Of Washington|High resolution optical coherence tomography based imaging for intraluminal and interstitial use implemented with a reduced form factor| US8485199B2|2007-05-08|2013-07-16|Acclarent, Inc.|Methods and devices for protecting nasal turbinate during surgery| US8989842B2|2007-05-16|2015-03-24|General Electric Company|System and method to register a tracking system with intracardiac echocardiographyimaging system| US8428690B2|2007-05-16|2013-04-23|General Electric Company|Intracardiac echocardiography image reconstruction in combination with position tracking system| US8527032B2|2007-05-16|2013-09-03|General Electric Company|Imaging system and method of delivery of an instrument to an imaged subject| US8364242B2|2007-05-17|2013-01-29|General Electric Company|System and method of combining ultrasound image acquisition with fluoroscopic image acquisition| US8934961B2|2007-05-18|2015-01-13|Biomet Manufacturing, Llc|Trackable diagnostic scope apparatus and methods of use| US20100266171A1|2007-05-24|2010-10-21|Surgiceye Gmbh|Image formation apparatus and method for nuclear imaging| US20080319307A1|2007-06-19|2008-12-25|Ethicon Endo-Surgery, Inc.|Method for medical imaging using fluorescent nanoparticles| US20090003528A1|2007-06-19|2009-01-01|Sankaralingam Ramraj|Target location by tracking of imaging device| US9883818B2|2007-06-19|2018-02-06|Accuray Incorporated|Fiducial localization| JP2009028366A|2007-07-27|2009-02-12|Toshiba Corp|Ultrasonic diagnostic apparatus| US8155728B2|2007-08-22|2012-04-10|Ethicon Endo-Surgery, Inc.|Medical system, method, and storage medium concerning a natural orifice transluminal medical procedure| FR2920084B1|2007-08-24|2010-08-20|Endocontrol|IMAGING SYSTEM FOR MONITORING A SURGICAL TOOL 
Legal status:
2006-08-22 | FPAY | Fee payment | Year of fee payment: 4
2007-10-31 | AS | Assignment | Owner: SHAHIDI, RAMIN, CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; Assignor: THE BOARD OF TRUSTEES OF THE LELAND STANFORD JUNIOR UNIVERSITY; Reel/Frame: 020123/0450; Effective date: 20071031
2007-11-30 | AS | Assignment | Owner: SHAHIDI, RAMIN, CALIFORNIA; Free format text: CHANGE OF ASSIGNEE ADDRESS; Assignor: SHAHIDI, RAMIN; Reel/Frame: 020184/0435; Effective date: 20071130
2010-04-13 | AS | Assignment | Owner: CALIFORNIA INSTITUTE OF COMPUTER ASSISTED SURGERY; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; Assignor: SHAHIDI, RAMIN; Reel/Frame: 024225/0083; Effective date: 20100408
2010-10-11 | REMI | Maintenance fee reminder mailed
2011-03-04 | REIN | Reinstatement after maintenance fee payment confirmed
2011-03-04 | LAPS | Lapse for failure to pay maintenance fees
2011-04-26 | FP | Expired due to failure to pay maintenance fee | Effective date: 20110304
2013-02-25 | PRDP | Patent reinstated due to the acceptance of a late maintenance fee | Effective date: 20130228
2013-02-28 | FPAY | Fee payment | Year of fee payment: 8
2013-02-28 | SULP | Surcharge for late payment
2013-02-28 | STCF | Information on status: patent grant | Free format text: PATENTED CASE
2014-10-10 | REMI | Maintenance fee reminder mailed
2015-03-04 | SULP | Surcharge for late payment | Year of fee payment: 11
2015-03-04 | FPAY | Fee payment | Year of fee payment: 12
Priority:
Application number | Publication number | Priority date | Filing date | Title
US2066496P | | 1996-06-28 | 1996-06-28 |
US88428997A | | 1997-06-27 | 1997-06-27 |
US09/411,363 | US6167296A | 1996-06-28 | 1999-09-30 | Method for volumetric image navigation
US09/747,463 | US6591130B2 | 1996-06-28 | 2000-12-22 | Method of image-enhanced endoscopy at a patient site
US09/777,777 | US6529758B2 | 1996-06-28 | 2001-02-05 | Method and apparatus for volumetric image navigation